Fixed-point algorithms for learning determinantal point processes
Authors
Abstract
Determinantal point processes (DPPs) offer an elegant tool for encoding probabilities over subsets of a ground set. Discrete DPPs are parametrized by a positive semidefinite matrix (called the DPP kernel), and estimating this kernel is key to learning DPPs from observed data. We consider the task of learning the DPP kernel, and develop for it a surprisingly simple yet effective new algorithm. Our algorithm offers the following benefits over previous approaches: (a) it is much simpler; (b) it yields equally good and sometimes even better local maxima; and (c) it runs an order of magnitude faster on large problems. We present experimental results on both real and simulated data to illustrate the numerical performance of our technique.
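The abstract does not reproduce the update rule itself, so the following is only a minimal sketch of a Picard-style fixed-point update of the form L ← L + LΔL, where Δ is the gradient of the mean log-likelihood of the observed subsets; it is an illustrative reading of this kind of approach, not necessarily the exact iteration or safeguards used in the paper, and the function names and toy data are made up for illustration.

```python
import numpy as np

def dpp_log_likelihood(L, samples):
    """Mean log-likelihood of observed subsets under the DPP with kernel L:
    (1/n) * sum_i log det(L_{Y_i}) - log det(L + I)."""
    N = L.shape[0]
    _, logdet_full = np.linalg.slogdet(L + np.eye(N))
    total = 0.0
    for Y in samples:
        total += np.linalg.slogdet(L[np.ix_(Y, Y)])[1]
    return total / len(samples) - logdet_full

def picard_step(L, samples):
    """One fixed-point update L <- L + L @ Delta @ L, where Delta is the
    gradient of the mean log-likelihood with respect to L (illustrative)."""
    N = L.shape[0]
    Delta = -np.linalg.inv(L + np.eye(N))
    for Y in samples:
        idx = np.ix_(Y, Y)
        Delta[idx] += np.linalg.inv(L[idx]) / len(samples)
    return L + L @ Delta @ L

# Toy run: 5 items, 3 observed subsets, identity kernel as the starting point.
samples = [[0, 2], [1, 3, 4], [0, 1]]
L = np.eye(5)
for _ in range(50):
    L = picard_step(L, samples)
print(dpp_log_likelihood(L, samples))
```

Each iteration only needs the inverses of the small observed submatrices and of L + I, which is what makes updates of this shape attractive for moderate ground-set sizes.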
Similar resources
Determinantal point processes
In this survey we review two topics concerning determinantal (or fermion) point processes. First, we provide the construction of diffusion processes on the space of configurations whose invariant measure is the law of a determinantal point process. Second, we present some algorithms to sample from the law of a determinantal point process on a finite window. Related open problems are listed.
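As a concrete companion to the sampling discussion, here is a minimal sketch of the classical spectral sampling procedure for a DPP on a finite ground set (eigendecompose the L-ensemble kernel, keep eigenvectors independently, then pick one item per kept eigenvector). This is the standard textbook algorithm, not necessarily the specific algorithms surveyed in the paper, and `sample_dpp` is an illustrative name.

```python
import numpy as np

def sample_dpp(L, rng):
    """Draw one sample from the DPP with L-ensemble kernel L (spectral method)."""
    eigvals, eigvecs = np.linalg.eigh(L)
    # Phase 1: keep eigenvector n independently with probability lambda_n / (lambda_n + 1).
    keep = rng.random(len(eigvals)) < eigvals / (eigvals + 1.0)
    V = eigvecs[:, keep]
    sample = []
    # Phase 2: select one item per remaining column of V.
    while V.shape[1] > 0:
        probs = np.sum(V ** 2, axis=1)
        probs /= probs.sum()
        i = rng.choice(len(probs), p=probs)
        sample.append(i)
        # Restrict span(V) to the subspace orthogonal to e_i.
        j = np.argmax(np.abs(V[i, :]))           # a column with nonzero entry at row i
        Vj = V[:, j]
        V = V - np.outer(Vj, V[i, :] / V[i, j])  # zero out row i in every column
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)               # re-orthonormalize the basis
    return sorted(sample)

rng = np.random.default_rng(0)
L = np.array([[1.0, 0.4, 0.0],
              [0.4, 1.0, 0.3],
              [0.0, 0.3, 1.0]])
print(sample_dpp(L, rng))
```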
Determinantal Point Processes for Machine Learning
Expectation-Maximization for Learning Determinantal Point Processes
A determinantal point process (DPP) is a probabilistic model of set diversity compactly parameterized by a positive semi-definite kernel matrix. To fit a DPP to a given task, we would like to learn the entries of its kernel matrix by maximizing the log-likelihood of the available data. However, log-likelihood is non-convex in the entries of the kernel matrix, and this learning problem is conjec...
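The same gradient Δ used in the fixed-point sketch above can also drive a plain baseline for this objective: projected gradient ascent on the log-likelihood. The sketch below is not the EM algorithm of the cited paper, only an illustration of what "learning the entries of the kernel matrix by maximizing the log-likelihood" looks like, with illustrative names, a toy dataset, and an eigenvalue-clipping projection to keep the iterate a valid kernel.

```python
import numpy as np

def grad_log_likelihood(L, samples):
    """Gradient of the mean DPP log-likelihood with respect to the kernel L."""
    N = L.shape[0]
    G = -np.linalg.inv(L + np.eye(N))
    for Y in samples:
        idx = np.ix_(Y, Y)
        G[idx] += np.linalg.inv(L[idx]) / len(samples)
    return G

def project_psd(M, eps=1e-6):
    """Clip eigenvalues so each iterate stays positive definite."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return (V * np.maximum(w, eps)) @ V.T

# Projected gradient ascent with a small fixed step size on a toy data set.
samples = [[0, 2], [1, 3, 4], [0, 1]]
L = np.eye(5)
for _ in range(500):
    L = project_psd(L + 0.1 * grad_log_likelihood(L, samples))
print(np.round(np.diag(L), 2))
```

Because the objective is non-convex, both this baseline and the fixed-point update only find local maxima; which one they reach depends on the starting kernel.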
On new faster fixed point iterative schemes for contraction operators and comparison of their rate of convergence in convex metric spaces
In this paper we present new iterative algorithms in convex metric spaces. We show that these schemes converge to the fixed point of a single-valued contraction operator and then compare their rates of convergence. Numerical examples for these iteration processes are also given.
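As a toy illustration of what comparing convergence rates of fixed-point schemes means (unrelated to the specific schemes and spaces studied in the cited paper), the sketch below contrasts the classical Picard iteration with a Mann-type averaged iteration for a simple contraction on the real line; the contraction and step parameter are arbitrary choices.

```python
def T(x):
    return 0.5 * x + 1.0          # contraction with factor 1/2, fixed point x* = 2

def picard(x, steps):
    # x_{n+1} = T(x_n)
    for _ in range(steps):
        x = T(x)
    return x

def mann(x, steps, a=0.7):
    # x_{n+1} = (1 - a) * x_n + a * T(x_n)
    for _ in range(steps):
        x = (1 - a) * x + a * T(x)
    return x

x0, x_star = 10.0, 2.0
for n in (5, 10, 20):
    print(n, abs(picard(x0, n) - x_star), abs(mann(x0, n) - x_star))
```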
Kronecker Determinantal Point Processes
Determinantal Point Processes (DPPs) are probabilistic models over all subsets of a ground set of N items. They have recently gained prominence in several applications that rely on “diverse” subsets. However, their applicability to large problems is still limited due to the O(N^3) complexity of core tasks such as sampling and learning. We enable efficient sampling and learning for DPPs by introducing...
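To illustrate why a Kronecker-structured kernel helps, the sketch below (an illustration under assumed notation, not the cited paper's algorithm) checks numerically that when L = L1 ⊗ L2, the DPP normalization constant det(L + I) can be computed from the eigenvalues of the two small factors alone, avoiding any cubic-cost operation on the full N x N kernel.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    A = rng.normal(size=(n, n))
    return A @ A.T / n

L1, L2 = random_psd(4), random_psd(6)           # kernel L = L1 kron L2, so N = 24

# Naive normalization constant: log det(L + I) on the full N x N kernel, O(N^3).
L = np.kron(L1, L2)
naive = np.linalg.slogdet(L + np.eye(L.shape[0]))[1]

# Kronecker shortcut: the eigenvalues of L1 kron L2 are all products lam_i * mu_j,
# so log det(L + I) = sum_{i,j} log(1 + lam_i * mu_j), using only the two small
# eigendecompositions.
lam, mu = np.linalg.eigvalsh(L1), np.linalg.eigvalsh(L2)
fast = np.log1p(np.outer(lam, mu)).sum()

print(np.allclose(naive, fast))                  # should print True
```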
Publication date: 2015